A LOCAL MINIMAX LOWER BOUND FOR NONPARAMETRIC DECONVOLUTION

Author

  • A. J. van Es
Abstract

Suppose we have i.i.d. observations with a distribution equal to the convolution of an unknown distribution function F and a known distribution function K. We derive a local minimax lower bound for the problem of estimating F at a fixed point x_0. We focus on the dependence of the lower bound on the smoothness of the distribution K. The performance of nonparametric maximum likelihood estimators and of kernel-type estimators with respect to the lower bound is reviewed.

Suppose we have observations X_1, ..., X_n which are independent. The distribution function of these observations is assumed to be equal to the convolution of an unknown distribution function F and a known distribution function K. Examples of such observations are, for instance, random measurements plus independent measurement errors with a known distribution, or ages at disease diagnosis as the sum of ages at infection plus independent incubation periods with a known distribution. In the first example K will typically correspond to a symmetric smooth distribution, while in the second example less smooth distributions like the exponential and gamma distributions are more plausible. We investigate the dependence of the hardness of the problem of estimating F on this smoothness, to give yet another quantification of the phrase "the smoother the known distribution K, the harder the deconvolution problem". We derive a local minimax lower bound for the problem of estimating F at a fixed point x_0, i.e. a lower bound for the minimal value over all estimators of the maximal expected absolute error over two underlying distributions. One is a fixed distribution F and the other is a local perturbation of F which, as n increases, approaches the fixed F. Related global lower bounds, where the expected absolute error is maximized over a fixed set of distributions with a certain minimal smoothness, i.e. having at least a continuous density, have been derived by Carroll and Hall (1988) and Fan (1991). Our perturbations will have bounded densities with discontinuities, so our result corresponds to a situation where no smoothness of F beyond the existence of a bounded density can be assumed.

First we construct the perturbations of F. Define the higher order differences of a function u as follows:

(Δ_t^(0) u)(x) = u(x),
(Δ_t^(l+1) u)(x) = (Δ_t^(l) u)(x + t) − (Δ_t^(l) u)(x − …
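As a small illustration (not part of the paper), the following Python sketch simulates data from the convolution model and implements the recursive difference operator Δ_t^(l) defined above. The abstract is cut off mid-formula; completing the recursion by subtracting the value at x − t is an assumption based on the visible pattern, and the specific choices of F (exponential), K (standard normal), sample size, test function, step t, and order l are made only for this example.

```python
import numpy as np

def higher_order_difference(u, x, t, l):
    """l-th order difference (Delta_t^(l) u)(x), defined recursively by
    Delta_t^(0) u = u and (assumed completion of the truncated formula)
    (Delta_t^(l+1) u)(x) = (Delta_t^(l) u)(x + t) - (Delta_t^(l) u)(x - t)."""
    if l == 0:
        return u(x)
    return (higher_order_difference(u, x + t, t, l - 1)
            - higher_order_difference(u, x - t, t, l - 1))

# Simulate the deconvolution model: we only observe X_i = Y_i + Z_i,
# where Y_i ~ F (unknown; exponential here, echoing the incubation-period
# example) and Z_i ~ K (known; a normal measurement error here).
rng = np.random.default_rng(0)
n = 1000
Y = rng.exponential(scale=1.0, size=n)       # draws from the unknown F
Z = rng.normal(loc=0.0, scale=1.0, size=n)   # draws from the known K
X = Y + Z                                    # the observed sample

# Second-order difference of a test function at x = 0 with step t = 0.1.
print(higher_order_difference(np.sin, 0.0, 0.1, 2))
```

In the paper the operator is used to build the local perturbations of F; the sketch only shows the mechanics of the recursion and of the sampling model.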


Similar articles

Minimax Lower Bounds for Nonparametric Deconvolution (Report 98-25)

Suppose we have i.i.d. observations with a distribution equal to the convolution of an unknown distribution function F and a known distribution function K. We derive local minimax lower bounds for the problem of estimating F, its density f and its derivatives at a fixed point x_0. Contrary to a previous local minimax bound in Van Es (1998), only smooth perturbations are considered. The local boun...


Tradeoffs Between Global And Local Risks In Nonparametric Function Estimation

We investigate the possibility of finding multi-purpose nonparametric function estimators that are minimax rate optimal for estimating the entire function and are also minimax rate optimal for point estimation. We show that it is impossible to attain the global optimal rate of convergence as well as the local optimal rate at every point. An inequality concerning estimation in a two parameter st...


Minimax Risk Bounds in Extreme Value Theory

Asymptotic minimax estimators of a positive extreme value index under zero-one loss are investigated in the classical i.i.d. setup. To this end, we prove the weak convergence of suitable local experiments with Pareto distributions as center of localization to a white noise model, which was previously studied in the context of nonparametric local density estimation and regression. From this resu...


Adaptively local one-dimensional subproblems

We provide a new insight into the difficulty of nonparametric estimation of a whole function. A new method is invented for finding a minimax lower bound for globally estimating a function. The idea is to adjust automatically the direction to the nearly hardest one-dimensional subproblem at each location, and to use locally the difficulty of the one-dimensional subproblem. In a variety of contexts, our met...


A Remedy to Regression Estimators and Nonparametric Minimax Efficiency

It is known that both Watson-Nadaraya and Gasser-Muller types of regression estimators have some disadvantages. A smooth version of local polynomial regression estimators is proposed to remedy the disadvantages. The mean squared error and mean integrated squared error are computed explicitly. It turns out that by suitably selecting a kernel and a bandwidth, the proposed estimator has at least...
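As a hedged aside, the snippet below shows the classical Nadaraya-Watson (Watson-Nadaraya) kernel regression estimator that this abstract refers to, not the smooth local-polynomial estimator proposed in that paper; the toy regression function, the Gaussian kernel, and the bandwidth are illustrative assumptions only.

```python
import numpy as np

def nadaraya_watson(x0, X, Y, h):
    """Classical Nadaraya-Watson estimate of E[Y | X = x0] with a
    Gaussian kernel and bandwidth h (a local-constant fit)."""
    w = np.exp(-0.5 * ((X - x0) / h) ** 2)   # kernel weights around x0
    return np.sum(w * Y) / np.sum(w)

# Toy data: Y = sin(2*pi*X) + noise, with X uniform on [0, 1].
rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, size=200)
Y = np.sin(2 * np.pi * X) + rng.normal(scale=0.3, size=200)

print(nadaraya_watson(0.5, X, Y, h=0.05))    # estimate of the regression at 0.5
```

Local polynomial estimators replace this local-constant fit by a local weighted least-squares polynomial fit; the abstract describes the proposed estimator as a smooth version of such fits.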




Publication year: 1998